Silu simpler #19417
Conversation
The beta param was only accepted on the tensorflow/torch backends and not in the `keras.ops` API, nor was it tested. I think best just to ditch, since no one could be relying on it.
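For context, the generalized SiLU (swish) activation is `x * sigmoid(beta * x)`; with `beta = 1` it reduces to the standard SiLU that `torch.nn.functional.silu` computes. A minimal NumPy sketch of the semantics being discussed (not the Keras implementation itself):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def silu(x, beta=1.0):
    # Generalized SiLU/swish: x * sigmoid(beta * x).
    # With beta=1 this matches the standard SiLU definition.
    x = np.asarray(x, dtype=np.float64)
    return x * sigmoid(beta * x)

print(silu(0.0))  # 0.0
print(silu(1.0))  # ≈ 0.7311 (i.e. 1 * sigmoid(1))
```

Dropping `beta` means the public op only ever computes the `beta = 1` case, which is what every backend can delegate to a native fused kernel.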
Force-pushed from eb7d2a8 to c008881
@tirthasheshpatel does this make sense to you? Seems best to rely on the torch op directly if we can.
Looks good, thanks for the fix @mattdangerw!
Codecov Report: all modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master   #19417      +/-   ##
==========================================
+ Coverage   75.97%   75.98%   +0.01%
==========================================
  Files         366      366
  Lines       40740    40759      +19
  Branches     7944     7946       +2
==========================================
+ Hits        30952    30971      +19
  Misses       8075     8075
  Partials     1713     1713
```
```python
x = convert_to_tensor(x)
return x * sigmoid(beta * x)
```
Should we keep the beta argument around, and only use the tnn implementation when beta == 1?
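One way that suggestion could look, as a hypothetical sketch (here `_fused_silu` stands in for a backend-native op such as `torch.nn.functional.silu`; the name and dispatch are illustrative, not the actual Keras code):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def _fused_silu(x):
    # Stand-in for a fused backend kernel, e.g. torch.nn.functional.silu,
    # which only computes the beta == 1 case.
    return x * sigmoid(x)

def silu(x, beta=1.0):
    # Hypothetical dispatch: use the fused op in the common beta == 1 case,
    # fall back to the explicit formula otherwise.
    x = np.asarray(x, dtype=np.float64)
    if beta == 1.0:
        return _fused_silu(x)
    return x * sigmoid(beta * x)
```

The trade-off is that the `beta != 1` path would remain untested dead weight unless it were also added to the `keras.ops` signature.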
Do we want to expose it? The weird part was that this didn't exist on the actual `keras.ops` signature, or on the jax or numpy backends. The backend should match the ops signature, right? We don't want secret extra options hanging around on specific backends.
If it's not in the op signature we can drop it.
Commits brought in by the branch update:

* Refactor dtypes in codebase and add `float8_*` dtypes
* Update comments
* Fix for JAX export on GPU. (keras-team#19404)
* Fix formatting in export_lib. (keras-team#19405)
* `ops/numpy.py`: Support `key` as `list` in `GetItem` (keras-team#19310) — when loading a model that contains `GetItem` nodes with multidimensional indices/slices as `key`, the `key` argument is loaded from JSON as a `list`, not a `tuple` (JSON does not have the distinction), so a `key` list is treated as equivalent to a `key` tuple. Copying is important: otherwise, the later `pop()` will remove the bound slice elements from the op itself. `saving/serialization_lib_test.py`: add `test_numpy_get_item_layer()`, a test for consistent serialization/deserialization of a model containing `ops.numpy.GetItem`.
* feat(losses): add Dice loss implementation (keras-team#19409) — removed smooth parameter and type casting; adjusted casting and dot operator
* Bump the github-actions group with 1 update (keras-team#19412) — bumps `github/codeql-action` from 3.24.6 to 3.24.9 ([release notes](https://github.com/github/codeql-action/releases), [changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)). Signed-off-by: dependabot[bot] <support@github.com>
* Fix issue with shared layer deserialization
* Remove dead code in saving lib (keras-team#19415)
* Remove unused beta param for silu, use torch op directly (keras-team#19417) — the beta param was only accepted on the tensorflow/torch backends and not in the `keras.ops` API, nor was it tested; best to ditch it, since no one could be relying on it.
* Fix print_fn for custom function (keras-team#19419)
* Add fp8 to `EinsumDense`
* Add test script